52 research outputs found

    Student Perspectives on E-Learning in a Malaysian Medical College One Year into the COVID-19 Pandemic: A Cross-Sectional Study

    Background: During the coronavirus disease 2019 (COVID-19) pandemic, physical in-person classes at Newcastle University Medicine Malaysia (NUMed) were replaced with e-learning. Teaching at NUMed was delivered online for one and a half academic years (18 March 2020 to June 2021) owing to strict lockdowns and physical distancing measures, limited in-person sessions on campus, and disrupted hospital attachments. There was concern over the effectiveness of, and student satisfaction with, e-learning, and how these affected overall academic performance. Our study aimed to explore students' e-learning experiences and their perceived benefits and challenges during the pandemic. Methods: 285 students participated in this cross-sectional study, recruited by convenience sampling. Participants completed a self-administered online questionnaire, distributed via email invitation between 12 July and 12 August 2021, which covered sociodemographic characteristics and experiences with e-learning. Data were analysed using descriptive statistics, and Spearman's correlation tests were used to identify correlations between students' e-learning experiences, the effects of e-learning, and academic performance. Results: Most students used laptops for e-learning (n=275, 96.5%) and owned at least two electronic devices (n=245, 86%). Over half of the students (n=148, 51.9%) reported that their theoretical knowledge remained unchanged, while about three-quarters (n=213, 74.7%) perceived that their practical skills had worsened. Students preferred paper-based exams (n=170, 59.6%) and objectively formatted online exams (n=193, 67.7%). Since transitioning to e-learning, the majority of students (n=207, 72.6%) reported difficulties studying online and were unsatisfied with their academic performance (n=166, 58.2%).
Students preferred e-learning because of the lower risk of contracting COVID-19 (n=256, 89.8%), the convenience of online classes (n=244, 85.6%), and flexible schedules (n=219, 76.8%). However, the lack of patient contact (n=236, 82.8%), reduced lecturer and peer interactions (n=234, 82.1%), and unreliable internet access (n=201, 70.5%) made e-learning challenging. Students' experiences were generally affected by multiple factors encompassing personal, lecturer, and environmental aspects. Conclusion: E-learning during the COVID-19 pandemic negatively affected students' academic performance, particularly practical skills and performance satisfaction. The university should therefore address e-learning constraints and provide adequate support to improve students' educational experiences in the ongoing pandemic.

    Transport genes and chemotaxis in Laribacter hongkongensis: a genome-wide analysis

    Background: Laribacter hongkongensis is a Gram-negative, seagull-shaped rod associated with community-acquired gastroenteritis. The bacterium has been found in diverse freshwater environments including fish, frogs, and drinking water reservoirs. Using the complete genome sequence data of L. hongkongensis, we performed a comprehensive analysis of putative transport-related genes and genes related to chemotaxis, motility, and quorum sensing, which may help the bacterium adapt to changing environments and combat harmful substances. Results: A genome-wide analysis using the Transport Classification Database (TCDB), similarity searches, and keyword searches revealed a large diversity of transporters (n = 457) and genes related to chemotaxis (n = 52) and flagellar biosynthesis (n = 40) in the L. hongkongensis genome. The transporters included those from all seven major transporter categories, which may allow the uptake of essential nutrients or ions and the extrusion of metabolic end products and hazardous substances. L. hongkongensis is unique among closely related members of the Neisseriaceae family in possessing a higher number of proteins related to the transport of ammonium, urea, and dicarboxylate, which may reflect the importance of nitrogen and dicarboxylate metabolism in this asaccharolytic bacterium. Structural modeling of two C4-dicarboxylate transporters showed that they possess structures similar to the determined structures of other DctP-TRAP transporters, with one having an unusual disulfide bond. Diverse mechanisms for iron transport, including hemin transporters for iron acquisition from host proteins, were also identified. In addition to the chemotaxis and flagella-related genes, the L. hongkongensis genome also contained two copies of qseB/qseC homologues of the AI-3 quorum sensing system. Conclusions: The large number of diverse transporters and genes involved in chemotaxis, motility, and quorum sensing suggests that the bacterium may use a complex system to adapt to different environments. Structural modeling will provide useful insights into the transporters of L. hongkongensis.

    Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the United States

    Short-term probabilistic forecasts of the trajectory of the COVID-19 pandemic in the United States have served as a visible and important communication channel between the scientific modeling community and both the general public and decision-makers. Forecasting models provide specific, quantitative, and evaluable predictions that inform short-term decisions such as healthcare staffing needs, school closures, and allocation of medical supplies. Starting in April 2020, the US COVID-19 Forecast Hub (https://covid19forecasthub.org/) collected, disseminated, and synthesized tens of millions of specific predictions from more than 90 different academic, industry, and independent research groups. A multimodel ensemble forecast that combined predictions from dozens of groups every week provided the most consistently accurate probabilistic forecasts of incident deaths due to COVID-19 at the state and national level from April 2020 through October 2021. The performance of 27 individual models that submitted complete forecasts of COVID-19 deaths consistently throughout this year showed high variability in forecast skill across time, geospatial units, and forecast horizons. Two-thirds of the models evaluated showed better accuracy than a naïve baseline model. Forecast accuracy degraded as models made predictions further into the future, with probabilistic error at a 20-wk horizon three to five times larger than when predicting at a 1-wk horizon. This project underscores the role that collaboration and active coordination between governmental public-health agencies, academic modeling teams, and industry partners can play in developing modern modeling capabilities to support local, state, and federal response to outbreaks.
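Probabilistic forecasts of this kind are typically scored with interval-based proper scoring rules. As a minimal sketch (not the Hub's actual evaluation code, and with invented numbers), the standard interval score for a central (1 - alpha) prediction interval combines interval width with penalties, scaled by 2/alpha, for observations that fall outside the interval; forecasts at longer horizons tend to need wider intervals and still miss more often, which is how probabilistic error grows with horizon:

```python
def interval_score(lower, upper, alpha, observed):
    """Interval score for a central (1 - alpha) prediction interval.
    Lower is better: width penalty plus 2/alpha-scaled miss penalties."""
    score = upper - lower                              # sharpness (width)
    if observed < lower:
        score += (2 / alpha) * (lower - observed)      # miss below interval
    elif observed > upper:
        score += (2 / alpha) * (observed - upper)      # miss above interval
    return score

# Invented example: an 80% interval (alpha = 0.2) at two horizons.
print(interval_score(90, 110, 0.2, 105))   # covered: score is the width, 20
print(interval_score(50, 150, 0.2, 210))   # wide and missed: 100 + 10*60 = 700
```

The weighted interval score used in this line of work averages such scores over several nominal levels plus an absolute-error term for the median, but the single-interval form above carries the key trade-off between sharpness and calibration.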

    Hyperoxemia and excess oxygen use in early acute respiratory distress syndrome : Insights from the LUNG SAFE study

    Background: Concerns exist regarding the prevalence and impact of unnecessary oxygen use in patients with acute respiratory distress syndrome (ARDS). We examined this issue in patients with ARDS enrolled in the Large observational study to UNderstand the Global impact of Severe Acute respiratory FailurE (LUNG SAFE) study. Methods: In this secondary analysis of the LUNG SAFE study, we wished to determine the prevalence of, and the outcomes associated with, hyperoxemia on day 1, sustained hyperoxemia, and excessive oxygen use in patients with early ARDS. Patients who fulfilled the criteria for ARDS on day 1 and day 2 of acute hypoxemic respiratory failure were categorized based on the presence of hyperoxemia (PaO2 > 100 mmHg) on day 1, sustained (i.e., present on day 1 and day 2) hyperoxemia, or excessive oxygen use (FIO2 ≥ 0.60 during hyperoxemia). Results: Of the 2005 patients who met the inclusion criteria, 131 (6.5%) were hypoxemic (PaO2 < 55 mmHg), 607 (30%) had hyperoxemia on day 1, and 250 (12%) had sustained hyperoxemia. Excess FIO2 use occurred in 400 (66%) of the 607 patients with hyperoxemia. Excess FIO2 use decreased from day 1 to day 2 of ARDS, with most hyperoxemic patients on day 2 receiving relatively low FIO2. Multivariate analyses found no independent relationship between day 1 hyperoxemia, sustained hyperoxemia, or excess FIO2 use and adverse clinical outcomes. Mortality was 42% in patients with excess FIO2 use, compared to 39% in a propensity-matched sample of normoxemic (PaO2 55-100 mmHg) patients (P = 0.47). Conclusions: Hyperoxemia and excess oxygen use are both prevalent in early ARDS but are most often not sustained. No relationship was found between hyperoxemia or excessive oxygen use and patient outcome in this cohort. Trial registration: LUNG SAFE is registered with ClinicalTrials.gov, NCT02010073.

    Open data from the third observing run of LIGO, Virgo, KAGRA and GEO

    The global network of gravitational-wave observatories now includes five detectors, namely LIGO Hanford, LIGO Livingston, Virgo, KAGRA, and GEO 600. These detectors collected data during their third observing run, O3, composed of three phases: O3a starting in April of 2019 and lasting six months, O3b starting in November of 2019 and lasting five months, and O3GK starting in April of 2020 and lasting 2 weeks. In this paper we describe these data and various other science products that can be freely accessed through the Gravitational Wave Open Science Center at https://gwosc.org. The main dataset, consisting of the gravitational-wave strain time series that contains the astrophysical signals, is released together with supporting data useful for their analysis and documentation, tutorials, as well as analysis software packages.

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Using high-resolution contact networks to evaluate SARS-CoV-2 transmission and control in large-scale multi-day events

    The emergence of highly transmissible SARS-CoV-2 variants has created a need to reassess the risk posed by increasing social contacts as countries resume pre-pandemic activities, particularly in the context of resuming large-scale events over multiple days. To examine how social contacts formed in different activity settings influence the interventions required to control Delta variant outbreaks, we collected high-resolution data on contacts among passengers and crew on cruise ships and combined the data with network transmission models. We found passengers had a median of 20 (IQR 10–36) unique close contacts per day, and over 60% of their contact episodes were made in dining or sports areas where mask wearing is typically limited. In simulated outbreaks, we found that vaccination coverage and rapid antigen tests had a larger effect than mask mandates alone, indicating the importance of combined interventions against Delta to reduce event risk in the vaccine era.
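The study couples empirical contact networks with transmission models. As a toy sketch of that idea (my own minimal version, not the authors' model, with an invented network and parameters), a discrete-time outbreak can be run on a static contact graph in which each infectious person transmits to each susceptible daily contact with a fixed per-contact probability:

```python
import random

def simulate_outbreak(contacts, p_transmit, seed_node, rng):
    """Toy discrete-time outbreak on a static contact network.
    contacts: dict mapping node -> list of daily close contacts.
    Each node is infectious for one time step, then recovers.
    Returns the final number of ever-infected nodes."""
    infectious = {seed_node}
    recovered = set()
    while infectious:
        newly_infected = set()
        for node in infectious:
            for contact in contacts.get(node, []):
                if contact not in infectious and contact not in recovered:
                    if rng.random() < p_transmit:
                        newly_infected.add(contact)
        recovered |= infectious
        infectious = newly_infected
    return len(recovered)

# Invented 5-person network: a small dining-table clique plus one outsider.
network = {0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4], 4: [3]}
size = simulate_outbreak(network, p_transmit=0.4, seed_node=0,
                         rng=random.Random(42))
print(f"final outbreak size: {size}")
```

Interventions map naturally onto this sketch: vaccination removes nodes from the susceptible pool, testing removes infectious nodes early, and masking lowers `p_transmit` on a subset of edges, which is why setting-specific contact data matters for comparing them.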

    PANC Study (Pancreatitis: A National Cohort Study): national cohort study examining the first 30 days from presentation of acute pancreatitis in the UK

    Background: Acute pancreatitis is a common, yet complex, emergency surgical presentation. Multiple guidelines exist and management can vary significantly. The aim of this first UK, multicentre, prospective cohort study was to assess the variation in management of acute pancreatitis to guide resource planning and optimize treatment. Methods: All patients aged 18 years or over presenting with acute pancreatitis, as per the Atlanta criteria, from March to April 2021 were eligible for inclusion and followed up for 30 days. Anonymized data were uploaded to a secure electronic database in line with local governance approvals. Results: A total of 113 hospitals contributed data on 2580 patients, with an equal sex distribution and a mean age of 57 years. The aetiology was gallstones in 50.6 per cent of cases, with idiopathic the next most common (22.4 per cent). In addition to the 7.6 per cent with a diagnosis of chronic pancreatitis, 20.1 per cent of patients had had a previous episode of acute pancreatitis. One in 20 patients was classed as having severe pancreatitis, as per the Atlanta criteria. The overall mortality rate was 2.3 per cent at 30 days, but rose to one in three in the severe group. Predictors of death included male sex, increased age, and frailty; previous acute pancreatitis and gallstones as aetiologies were protective. Smoking status and body mass index did not affect mortality. Conclusion: Most patients presenting with acute pancreatitis have a mild, self-limiting disease. Rates of idiopathic pancreatitis are high. Recurrent attacks of pancreatitis are common, but are likely to carry a reduced risk of death on subsequent admissions.

    Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19

    IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19. OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19. DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022). INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days. MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a Bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes. RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% Bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively.
Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively). CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes. TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570

    Perception of facial emotions.

    Facial emotions are the main expressers and communicators of internal states and affect. Specific domains of the perception of facial emotions are examined: biological aspects, cultural aspects, and changes throughout the lifespan. Following that, we explore research on dysfunction and abnormalities in the perception of facial emotions that accompany a variety of neuropsychological conditions: schizophrenia, Alzheimer's disease, and autism spectrum disorder. Finally, we look into face viewpoint and gaze direction and their influence on the perception of facial emotions. Conclusions are drawn about the linkages between the different domains.